ML-MOEA/SOM: A Manifold-Learning-Based Multiobjective Evolutionary Algorithm Via Self-Organizing Maps
Authors
Abstract
Under mild conditions, it can be deduced from the Karush–Kuhn–Tucker conditions that the Pareto set, in the decision space, of a continuous multiobjective optimization problem (MOP) is a piecewise continuous (m−1)-dimensional manifold, where m is the number of objectives. On the one hand, traditional evolutionary multiobjective optimization algorithms (EMOAs) cannot exploit this regularity property; on the other hand, the Regularity Model-Based Multiobjective Estimation of Distribution Algorithm (RM-MEDA) can only build linear models of the decision space, using linear modelling algorithms such as the local principal component analysis algorithm (Local PCA). To address these shortcomings of EMOAs and RM-MEDA, the Manifold-Learning-Based Multiobjective Evolutionary Algorithm via Self-Organizing Maps (ML-MOEA/SOM) is proposed for continuous multiobjective optimization problems. At each generation, the algorithm first learns such a nonlinear manifold in the decision space via a self-organizing map; new trial solutions are then generated by expanding the neurons of the SOM with random noise; finally, a nondominated-sorting-based selection chooses the solutions for the next generation. Systematic experiments on a set of test instances with variable linkages have shown that, overall, ML-MOEA/SOM outperforms NSGA-II and is competitive with RM-MEDA in terms of convergence and diversity. Compared with NSGA-II and RM-MEDA, ML-MOEA/SOM can thus uncover, via self-organizing maps, the nonlinear manifold hidden in the decision space of multiobjective optimization problems.
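To make the generation loop concrete, the following is a minimal sketch of one ML-MOEA/SOM-style iteration in Python, assuming a bi-objective problem so that the learned manifold is approximated by a one-dimensional chain of SOM neurons. All function names, parameter values (neuron count, learning rate, noise level) and the toy test problem are illustrative assumptions rather than the authors' reference implementation, and the selection step is a simplified nondominated-first truncation rather than full nondominated sorting with diversity maintenance.

import numpy as np

def train_som(X, n_neurons=10, epochs=30, lr0=0.5, sigma0=2.0):
    """Fit a 1-D SOM (chain of neurons) to the decision vectors X; return its codebook."""
    rng = np.random.default_rng(0)
    W = X[rng.choice(len(X), n_neurons, replace=False)].copy()   # init neurons from data
    grid = np.arange(n_neurons)                                  # neuron positions on the chain
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                            # decaying learning rate
        sigma = max(sigma0 * (1.0 - t / epochs), 0.5)            # decaying neighborhood width
        for x in X[rng.permutation(len(X))]:
            b = np.argmin(np.linalg.norm(W - x, axis=1))         # best-matching unit
            h = np.exp(-((grid - b) ** 2) / (2 * sigma ** 2))    # lattice neighborhood kernel
            W += lr * h[:, None] * (x - W)                       # pull neurons toward x
    return W

def expand_neurons(W, n_offspring, noise=0.05, bounds=(0.0, 1.0)):
    """Build trial solutions by adding Gaussian noise around randomly chosen neurons."""
    rng = np.random.default_rng()
    idx = rng.integers(0, len(W), n_offspring)
    Y = W[idx] + rng.normal(0.0, noise, size=(n_offspring, W.shape[1]))
    return np.clip(Y, *bounds)                                   # respect box constraints

def ndsort_select(X, F, n_keep):
    """Keep n_keep solutions, preferring nondominated ones (simplified selection)."""
    dominated = np.array([any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                              for j in range(len(F))) for i in range(len(F))])
    order = np.argsort(dominated.astype(int))                    # nondominated first
    return X[order[:n_keep]], F[order[:n_keep]]

def evaluate(X):
    """Toy ZDT1-like bi-objective problem, used only to exercise the sketch."""
    f1 = X[:, 0]
    g = 1.0 + 9.0 * X[:, 1:].mean(axis=1)
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return np.column_stack([f1, f2])

pop = np.random.default_rng(1).random((50, 10))                  # initial population in [0, 1]^10
for gen in range(100):
    W = train_som(pop)                                           # 1) learn the manifold model
    kids = expand_neurons(W, len(pop))                           # 2) sample around the manifold
    merged = np.vstack([pop, kids])
    pop, _ = ndsort_select(merged, evaluate(merged), len(pop))   # 3) environmental selection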
Similar resources
Density Estimation by Mixture Models with Smoothing Priors
In the statistical approach for self-organizing maps (SOMs), learning is regarded as an estimation algorithm for a gaussian mixture model with a gaussian smoothing prior on the centroid parameters. The values of the hyperparameters and the topological structure are selected on the basis of a statistical principle. However, since the component selection probabilities are fixed to a common value,...
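For orientation, one generic way to write down this mixture-with-smoothing-prior view of the SOM is sketched below in LaTeX; the notation (K components with a fixed common mixing weight 1/K, precision β, a discrete smoothing operator D over the map lattice, and hyperparameter α) is an illustrative reconstruction, not necessarily the cited letter's exact formulation.

% Generic SOM-as-GMM sketch (illustrative notation, not the cited letter's exact one)
p(x \mid W, \beta) = \frac{1}{K} \sum_{k=1}^{K}
  \left(\frac{\beta}{2\pi}\right)^{d/2}
  \exp\!\left(-\frac{\beta}{2}\,\lVert x - w_k \rVert^{2}\right),
\qquad
p(W \mid \alpha) \propto
  \exp\!\left(-\frac{\alpha}{2} \sum_{j=1}^{d} \lVert D\, w^{(j)} \rVert^{2}\right)

Here the w_k are the centroid (codebook) vectors, w^(j) collects the j-th coordinate of every centroid, the fixed weight 1/K corresponds to the common component selection probabilities mentioned above, and the prior pulls neighbouring centroids on the lattice toward each other; learning then amounts to maximum a posteriori estimation of W, with α and β chosen by a statistical model-selection criterion.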
Decomposition-Based Multiobjective Evolutionary Algorithm with an Ensemble of Neighborhood Sizes
The multiobjective evolutionary algorithm based on decomposition (MOEA/D) has demonstrated superior performance by winning the multiobjective optimization algorithm competition at the CEC 2009. For effective performance of MOEA/D, the neighborhood size (NS) parameter has to be tuned. In this letter, an ensemble of different NSs with online self-adaptation is proposed (ENS-MOEA/D) to over...
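As a rough illustration of the ensemble mechanism described in that letter, the sketch below keeps a pool of candidate neighborhood sizes and picks one per reproduction step with probability proportional to its recent success rate. The pool values, window length, exploration floor, and class/method names are assumptions made for this sketch, not the letter's exact settings.

import numpy as np

class NeighborhoodSizeEnsemble:
    def __init__(self, pool=(25, 50, 75, 100), window=50, eps=0.05):
        self.pool = list(pool)                        # candidate neighborhood sizes
        self.window = window                          # how many recent outcomes to remember
        self.eps = eps                                # floor so every NS keeps being tried
        self.history = {ns: [] for ns in self.pool}   # 1.0 = offspring improved a subproblem

    def sample(self, rng):
        """Pick an NS with probability proportional to its recent success rate."""
        rates = np.array([np.mean(self.history[ns]) if self.history[ns] else 1.0
                          for ns in self.pool])
        rates = np.maximum(rates, 1e-6)               # avoid an all-zero distribution
        probs = (1 - self.eps) * rates / rates.sum() + self.eps / len(self.pool)
        return int(rng.choice(self.pool, p=probs))

    def report(self, ns, improved):
        """Record whether reproduction under this NS improved some subproblem."""
        h = self.history[ns]
        h.append(1.0 if improved else 0.0)
        if len(h) > self.window:
            h.pop(0)

Inside MOEA/D's reproduction step, sample(rng) would decide how many neighbouring subproblems supply the mating parents, and report(ns, improved) would be called after checking whether the resulting child improved the scalarized objective of any subproblem in that neighbourhood.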
Pattern Analysis with Layered Self-Organizing Maps
This paper defines a new learning architecture, Layered Self-Organizing Maps (LSOMs), that uses the SOM and supervised-SOM learning algorithms. The architecture is validated with the MNIST database of hand-written digit images. LSOMs are similar to convolutional neural nets (convnets) in the way they sample data, but different in the way they represent features and learn. ...
Multiobjective genetic programming for maximizing ROC performance
In binary classification problems, receiver operating characteristic (ROC) graphs are commonly used for visualizing, organizing and selecting classifiers based on their performance. An important issue in the ROC literature is to obtain the ROC convex hull (ROCCH), which covers the potentially optimal classifiers for a given set of classifiers [1]. Maximizing the ROCCH means maximizing the true positive rate (tpr)...
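As a small, generic illustration of the ROCCH idea (not the paper's genetic-programming code), the sketch below computes the upper convex hull of a set of classifiers, each represented by a (false positive rate, true positive rate) point, together with the trivial ROC corners (0, 0) and (1, 1); classifiers strictly below this hull are not potentially optimal under any class or cost distribution.

import numpy as np

def rocch(points):
    """Return the ROC convex hull (upper frontier) of (fpr, tpr) points."""
    pts = np.unique(np.vstack([points, [[0.0, 0.0], [1.0, 1.0]]]), axis=0)  # add corners, sort rows
    hull = []
    for p in pts:                                     # monotone-chain upper hull, left to right
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            cross = (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1)
            if cross >= 0:                            # hull[-1] lies on or below segment hull[-2] -> p
                hull.pop()
            else:
                break
        hull.append((float(p[0]), float(p[1])))
    return np.array(hull)

# Example: the classifier at (0.5, 0.6) falls below the hull and is therefore excluded.
print(rocch(np.array([[0.1, 0.4], [0.3, 0.8], [0.5, 0.6]])))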
MEDACO: Solving Multiobjective Combinatorial Optimization with Evolution, Decomposition and Ant Colonies
We propose a novel multiobjective evolutionary algorithm, MEDACO, a shorter acronym for MOEA/D-ACO, combining ant colony optimization (ACO) and the multiobjective evolutionary algorithm based on decomposition (MOEA/D). The motivation is to use the online-learning capabilities of ACO, according to the Reactive Search Optimization (RSO) paradigm of "learning while optimizing", to further improve the ...